Search results for "FACIAL EXPRESSIONS"
Showing 10 of 15 documents
Event-related potentials to unattended changes in facial expressions: detection of regularity violations or encoding of emotions?
2013
Visual mismatch negativity (vMMN), a component in event-related potentials (ERPs), can be elicited when rarely presented “deviant” facial expressions violate regularity formed by repeated “standard” faces. vMMN is observed as differential ERPs elicited between the deviant and standard faces. It is not clear, however, whether differential ERPs to rare emotional faces interspersed with repeated neutral ones reflect true vMMN (i.e., detection of regularity violation) or merely encoding of the emotional content in the faces. Furthermore, a face-sensitive N170 response, which reflects structural encoding of facial features, can be modulated by emotional expressions. Owing to its similar latency …
Increased amygdala and parahippocampal gyrus activation in schizophrenic patients with auditory hallucinations: An fMRI study using independent compo…
2010
Objective: Hallucinations in patients with schizophrenia have strong emotional connotations. Functional neuroimaging techniques have been widely used to study brain activity in patients with schizophrenia with hallucinations or emotional impairments. However, few of these studies have investigated the association between hallucinations and emotional dysfunctions using an emotional auditory paradigm. Independent component analysis (ICA) is an analysis method that is especially useful for decomposing activation during complex cognitive tasks in which multiple operations occur simultaneously. Our aim in this study is to analyze brain activation after the presentation of emotional auditory stim…
Visual processing in patients with age-related macular degeneration performing a face detection test
2017
Purpose: People with age-related macular degeneration (AMD) have difficulties in familiar face recognition and facial expression discrimination. Our aim was to evaluate the visual processing of faces in AMD patients and whether this would be improved by anti-vascular endothelial growth factor therapy. This was a prospective interventional cohort study. Patients: Twelve patients with monocular wet AMD and 6 control subjects were recruited. Face detection processes were studied using cortical event-related potentials (ERPs). Patients received 3 bevacizumab intravitreal injections to the single affected eye. At baseline and 4–6 weeks after the last injection, clinical presentation and ERPs of …
Visual exploration of face and facial expression in infancy: A qualitative approach of cognitive and social development
2016
This article proposes a methodological consideration for the use of "head free" eye-tracking systems, which have allowed this technique to be extended to the study of infant skills. It explores how these technological developments enable a more qualitative approach, one that offers the possibility of considering not only "how long" but also "how" the infant looks at a visual scene, especially the scene of the face.
Identity–expression interaction in face perception: Sex, visual field, and psychophysical factors
2012
We investigated the psychophysical factors underlying the identity-emotion interaction in face perception. Visual field and sex were also taken into account. Participants had to judge whether a probe face, presented in either the left or the right visual field, and a central target face belonged to the same person while emotional expression varied (Experiment 1), or to judge whether the probe and target faces expressed the same emotion while identity was manipulated (Experiment 2). For accuracy we replicated the mutual facilitation effect between identity and emotion; no sex or hemispheric differences were found. Processing speed measurements, however, showed a lesser degree…
Brain's change detection elicited by emotional facial expressions in depressed and non-depressed individuals
2009
Auditory Emotion Word Primes Influence Emotional Face Categorization in Children and Adults, but Not Vice Versa
2018
In order to assess how the perception of audible speech and facial expressions influence one another for the perception of emotions, and how this influence might change over the course of development, we conducted two cross-modal priming experiments with three age groups of children (6-, 9-, and 12-years old), as well as college-aged adults. In Experiment 1, 74 children and 24 adult participants were tasked with categorizing photographs of emotional faces as positive or negative as quickly as possible after being primed with emotion words presented via audio in valence-congruent and valence-incongruent trials. In Experiment 2, 67 children and 24 adult participants carried out a similar cate…
Facial expression processing during the first year of life: development and the role of olfaction
2015
The first year of life is critical for the development of the abilities to process facial expressions. Olfaction and facial expressions are strongly linked to each other, and it is well known that infants are able to integrate their environment multisensorially from birth onward. However, most studies of the multisensory processing of facial expressions are restricted to the investigation of audio-visual interactions. In this thesis, we first aimed to resolve different issues concerning the ontogenesis of infants' ability to process facial expressions. Our results allowed us to specify the development of visual exploratory strategies for facial emotions along the first year of life…
The impact of visual working memory capacity on the filtering efficiency of emotional face distractors.
2018
Emotional faces can serve as distractors in visual working memory (VWM) tasks. An event-related potential called contralateral delay activity (CDA) can measure the filtering efficiency of face distractors. Previous studies have investigated the influence of VWM capacity on the filtering efficiency of simple neutral distractors but not of face distractors. We measured the CDA indicative of emotional face filtering during a VWM task related to facial identity. VWM capacity was measured in a separate colour change detection task, and participants were divided into high- and low-capacity groups. The high-capacity group was able to filter out distractors similarly irrespective of their facial emotion. …
Emotional communication in the context of joint attention for food stimuli: Effects on attentional and affective processing
2014
Guided by distinct theoretical frameworks (embodiment theories, the shared-signal hypothesis, and appraisal theories), we examined the effects of the gaze direction and emotional expressions (joy, disgust, and neutral) of virtual characters on participants' attention orienting and affective reactivity while they were engaged in joint attention for food stimuli contrasted by preference (disliked, moderately liked, and liked). The participants were exposed to videos of avatars looking at food and displaying facial expressions, with their gaze directed either toward the food only or toward the food and then the participants. We recorded eye-tracking responses, heart rate, facial electromyogr…